
Evolving Features in Neural Networks for System Identification

Yung-Keun Kwon and Byung-Ro Moon

School of Computer Science & Engineering, Seoul National University, Shilim-dong, Kwanak-gu, Seoul, 151-742 Korea
kwon@soar.snu.ac.kr
moon@soar.snu.ac.kr

Abstract. Given N data pairs {Xi, yi}, i=1,2,...,N, where each Xi is an n-dimensional vector of independent variables and yi is a dependent variable, the function approximation problem (FAP) is the problem of finding a function that best explains the N pairs of Xi and yi. From the universal approximation theorem and the inherent approximation capabilities demonstrated in various studies, artificial neural networks (ANNs) are considered powerful function approximators. Two main issues govern a feedforward neural network’s performance. One is determining its structure; the other is specifying the weights of the network that minimize its error. The genetic algorithm (GA) is a global search technique well suited to complex optimization problems, and it has therefore been considered a promising way to reinforce the performance of neural networks. Many researchers have tried to optimize the weights of networks using genetic algorithms, alone or combined with the backpropagation algorithm. Others have tried to find good topologies, a task that is even more difficult and has been called a “black art.”
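The GA-based weight optimization mentioned in the abstract can be illustrated with a minimal sketch. The network shape (one hidden layer of tanh units), the target function, and all GA parameters below are illustrative assumptions, not the authors' actual method:

```python
import math
import random

# Hedged sketch: a simple genetic algorithm evolving the weights of a
# 1-H-1 feedforward network to approximate y = sin(x) on [-3, 3].
# All hyperparameters here are assumptions for illustration only.

random.seed(0)

H = 5                 # number of hidden units (assumption)
DIM = 3 * H + 1       # chromosome: H input weights, H hidden biases,
                      # H output weights, 1 output bias

def predict(w, x):
    """Forward pass of the 1-H-1 network with tanh hidden units."""
    out = w[-1]
    for j in range(H):
        out += w[2 * H + j] * math.tanh(w[j] * x + w[H + j])
    return out

# Training pairs {X_i, y_i}, here sampled from the target function
data = [(x / 10.0, math.sin(x / 10.0)) for x in range(-30, 31)]

def mse(w):
    """Mean squared error of the network encoded by chromosome w."""
    return sum((predict(w, x) - y) ** 2 for x, y in data) / len(data)

def evolve(pop_size=40, generations=200):
    """Evolve weight vectors with elitism, uniform crossover, and
    Gaussian mutation; lower MSE means higher fitness."""
    pop = [[random.uniform(-1, 1) for _ in range(DIM)]
           for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=mse)
        elite = pop[: pop_size // 2]       # elitism keeps the best half
        children = []
        while len(elite) + len(children) < pop_size:
            a, b = random.sample(elite, 2)
            # uniform crossover plus small Gaussian mutation per gene
            child = [random.choice(pair) + random.gauss(0, 0.1)
                     for pair in zip(a, b)]
            children.append(child)
        pop = elite + children
    return min(pop, key=mse)

best = evolve()
err = mse(best)
print(f"best MSE after evolution: {err:.4f}")
```

Because the elite always survives, the best fitness is non-increasing across generations; a hybrid approach would instead refine each chromosome with a few backpropagation steps before evaluating it.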

LNCS 3103, p. 404 f.



© Springer-Verlag Berlin Heidelberg 2004